Bayesian fused lasso modeling via horseshoe prior
Authors
Abstract
Bayesian fused lasso is a sparse Bayesian method that shrinks both the regression coefficients and their successive differences simultaneously. In this paper, we propose Bayesian fused lasso modeling via the horseshoe prior. By assuming the horseshoe prior on the successive differences of the coefficients, the proposed method prevents over-shrinkage of those differences. We also propose a hexagonal operator for regression with shrinkage and equality selection with the horseshoe prior, which imposes priors on all combinations of the regression coefficients. Simulation studies and an application to real data show that the proposed method gives better performance than existing methods.
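As a rough illustration of the hierarchy sketched above, the following Python sketch (using NumPyro, which is an assumption of this note and not something the paper specifies) places a Laplace prior on the coefficients and a horseshoe-type half-Cauchy scale mixture on their successive differences; the model structure, names, and hyperparameters are illustrative and not the authors' exact formulation.

    import jax.numpy as jnp
    import numpyro
    import numpyro.distributions as dist

    def bayesian_fused_horseshoe(X, y):
        # Illustrative sketch (assumed hierarchy, not the paper's exact model):
        # Laplace shrinkage on coefficients, horseshoe-type shrinkage on
        # successive differences beta[j] - beta[j-1].
        n, p = X.shape
        sigma = numpyro.sample("sigma", dist.HalfCauchy(1.0))          # noise scale
        lam1 = numpyro.sample("lam1", dist.HalfCauchy(1.0))            # Laplace scale
        beta = numpyro.sample("beta", dist.Laplace(jnp.zeros(p), lam1))
        tau = numpyro.sample("tau", dist.HalfCauchy(1.0))              # global shrinkage
        lam = numpyro.sample("lam", dist.HalfCauchy(jnp.ones(p - 1)))  # local shrinkage
        diff = beta[1:] - beta[:-1]
        # Gaussian scale-mixture (horseshoe-style) factor on the differences,
        # added to the joint log-density.
        numpyro.factor("diff_prior",
                       dist.Normal(jnp.zeros(p - 1), tau * lam).log_prob(diff).sum())
        numpyro.sample("obs", dist.Normal(X @ beta, sigma), obs=y)

The model can then be sampled with, for example, NUTS via numpyro.infer.MCMC; the point of the sketch is only to show where the global and local horseshoe scales tau and lam enter relative to the difference terms.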
Similar resources
Sparsity and smoothness via the fused lasso
The lasso penalizes a least squares regression by the sum of the absolute values (L1-norm) of the coefficients. The form of this penalty encourages sparse solutions (with many coefficients equal to 0). We propose the ‘fused lasso’, a generalization that is designed for problems with features that can be ordered in some meaningful way. The fused lasso penalizes the L1-norm of both the coefficient...
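For concreteness, the fused lasso criterion referred to here can be written (in notation assumed for this note, not quoted from the abstract) as

\[
\hat{\beta} = \arg\min_{\beta} \; \frac{1}{2}\sum_{i=1}^{n}\Big(y_i - \sum_{j=1}^{p} x_{ij}\beta_j\Big)^{2}
+ \lambda_1 \sum_{j=1}^{p} |\beta_j|
+ \lambda_2 \sum_{j=2}^{p} |\beta_j - \beta_{j-1}|,
\]

where \lambda_1 controls sparsity of the coefficients and \lambda_2 controls how strongly successive coefficients are fused.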
Fused Multiple Graphical Lasso
In this paper, we consider the problem of estimating multiple graphical models simultaneously using the fused lasso penalty, which encourages adjacent graphs to share similar structures. A motivating example is the analysis of brain networks of Alzheimer’s disease using neuroimaging data. Specifically, we may wish to estimate a brain network for the normal controls (NC), a brain network for the...
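A standard way to write this kind of fused penalty over K ordered precision matrices \Theta^{(1)},\dots,\Theta^{(K)} is the sketch below (assumed notation; the paper's exact weighting may differ):

\[
\min_{\Theta^{(k)} \succ 0} \; \sum_{k=1}^{K}\Big(\operatorname{tr}\big(S^{(k)}\Theta^{(k)}\big) - \log\det\Theta^{(k)}\Big)
+ \lambda_1 \sum_{k=1}^{K}\sum_{i \neq j}\big|\theta_{ij}^{(k)}\big|
+ \lambda_2 \sum_{k=2}^{K}\sum_{i,j}\big|\theta_{ij}^{(k)} - \theta_{ij}^{(k-1)}\big|,
\]

where S^{(k)} is the sample covariance matrix of group k; the second penalty is what encourages adjacent graphs to share similar structure.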
Fused Lasso Additive Model.
We consider the problem of predicting an outcome variable using p covariates that are measured on n independent observations, in a setting in which additive, flexible, and interpretable fits are desired. We propose the fused lasso additive model (FLAM), in which each additive function is estimated to be piecewise constant with a small number of adaptively-chosen knots. FLAM is the solution to a...
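One common way to write this kind of additive fused lasso criterion is the sketch below (assumed notation, not necessarily the exact FLAM objective):

\[
\min_{\theta_0,\,\theta_1,\dots,\theta_p} \; \frac{1}{2}\Big\|y - \theta_0\mathbf{1} - \sum_{j=1}^{p}\theta_j\Big\|_2^{2}
+ \lambda \sum_{j=1}^{p}\Big(\alpha\,\big\|D P_j \theta_j\big\|_1 + (1-\alpha)\,\big\|\theta_j\big\|_2\Big),
\]

where \theta_j \in \mathbb{R}^{n} holds the fitted values of the j-th additive function, P_j sorts the observations by the j-th covariate, and D is the first-difference matrix; the fused \ell_1 term yields piecewise-constant fits and the \ell_2 term can zero out an entire function.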
Pairwise Fused Lasso
In the last decade several estimators have been proposed that enforce the grouping property. A regularized estimate exhibits the grouping property if it selects groups of highly correlated predictors rather than selecting one representative. The pairwise fused lasso is related to fusion methods but does not assume that predictors have to be ordered. By penalizing parameters and differences betwe...
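Up to weighting details, the penalty described here takes the form (notation assumed)

\[
\lambda_1 \sum_{j=1}^{p} |\beta_j| + \lambda_2 \sum_{j<k} |\beta_j - \beta_k|,
\]

which shrinks coefficients toward zero and toward one another without requiring any ordering of the predictors.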
A Bayesian Lasso via reversible-jump MCMC
Variable selection is a topic of great importance in high-dimensional statistical modeling and has a wide range of real-world applications. Many variable selection techniques have been proposed in the context of linear regression, and the Lasso model is probably one of the most popular penalized regression techniques. In this paper, we propose a new, fully hierarchical, Bayesian version of the ...
Journal
Journal title: Japanese Journal of Statistics and Data Science
Year: 2023
ISSN: 2520-8764, 2520-8756
DOI: https://doi.org/10.1007/s42081-023-00213-2